Nearly Tight Bounds for Testing Function Isomorphism
Authors
Abstract
Similar resources
Tight Lower Bounds for Testing Linear Isomorphism
We study lower bounds for testing membership in families of linear/affine-invariant Boolean functions over the hypercube. A family of functions P ⊆ {f : {0,1}^n → {0,1}} is linear/affine invariant if for any f ∈ P, it is the case that f ∘ L ∈ P for any linear/affine transformation L of the domain. Motivated by the recent resurgence of attention to the permutation isomorphism problem, we first focus...
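As a concrete illustration of the closure property above (a minimal sketch of my own, not code from the paper; the helpers compose and is_parity and the choice of the parity family are assumptions made for the example), the Python below checks that the linear-invariant family of parities P = {x ↦ a·x mod 2} is closed under every linear transformation L of {0,1}^n:

import itertools
import numpy as np

n = 3

# f given as a truth table over {0,1}^n; here the parity x1 XOR x3,
# a member of the linear-invariant family P = {x -> a.x mod 2}.
f = {x: x[0] ^ x[2] for x in itertools.product((0, 1), repeat=n)}

def compose(f, L):
    """Truth table of f∘L, where L is an n-by-n 0/1 matrix over GF(2)."""
    return {x: f[tuple(np.dot(L, x) % 2)] for x in f}

def is_parity(g):
    """Membership test for P: does g(x) = a.x mod 2 for some a?"""
    return any(
        all(g[x] == sum(ai * xi for ai, xi in zip(a, x)) % 2 for x in g)
        for a in itertools.product((0, 1), repeat=n)
    )

# Linear invariance: f ∈ P implies f∘L ∈ P for every linear map L.
for bits in itertools.product((0, 1), repeat=n * n):
    L = np.array(bits).reshape(n, n)
    assert is_parity(compose(f, L))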
Nearly Tight Bounds for Wormhole Routing
We present nearly tight bounds for wormhole routing on Butterfly networks which indicate that it is fundamentally different from store-and-forward packet routing. For instance, consider the problem of routing N log N (randomly generated) messages of length log N from the inputs to the outputs of an N-input Butterfly. We show that with high probability this must take time at least Ω(log³N/(log log...
Tight Bounds for Graph Homomorphism and Subgraph Isomorphism
We prove that unless the Exponential Time Hypothesis (ETH) fails, deciding if there is a homomorphism from graph G to graph H cannot be done in time |V(H)|^{o(|V(G)|)}. We also show an exponential-time reduction from Graph Homomorphism to Subgraph Isomorphism. This rules out (subject to ETH) the possibility of a |V(H)|^{o(|V(H)|)}-time algorithm deciding if graph G is a subgraph of H. For both problems o...
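For intuition on why such a bound is tight (an illustrative sketch of my own, not from the paper), the naive algorithm for Graph Homomorphism simply enumerates all |V(H)|^{|V(G)|} maps from V(G) to V(H), which is exactly the running time the ETH-based lower bound says cannot be beaten in the exponent:

from itertools import product

def has_homomorphism(G, H):
    """Brute force over all |V(H)|**|V(G)| maps V(G) -> V(H); a map is a
    homomorphism if it sends every edge of G to an edge of H.
    Graphs are (vertex list, set of directed edge pairs)."""
    VG, EG = G
    VH, EH = H
    for images in product(VH, repeat=len(VG)):
        phi = dict(zip(VG, images))
        if all((phi[u], phi[v]) in EH for (u, v) in EG):
            return True
    return False

def sym(edges):
    """Undirected graph: store each edge in both directions."""
    return set(edges) | {(v, u) for (u, v) in edges}

K3 = ([0, 1, 2], sym({(0, 1), (1, 2), (0, 2)}))
K2 = ([0, 1], sym({(0, 1)}))
print(has_homomorphism(K3, K3))  # True  (the identity map)
print(has_homomorphism(K3, K2))  # False (a triangle is not 2-colorable)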
Tight Bounds for Subgraph Isomorphism and Graph Homomorphism
We prove that unless the Exponential Time Hypothesis (ETH) fails, deciding if there is a homomorphism from graph G to graph H cannot be done in time |V(H)|^{o(|V(G)|)}. Combined with the reduction of Cygan, Pachocki, and Socała, our result rules out (subject to ETH) the possibility of a |V(G)|^{o(|V(G)|)}-time algorithm deciding if graph H is a subgraph of G. For both problems our lower bounds asymptotically match...
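The same picture holds for Subgraph Isomorphism (again a hedged sketch of mine, not the paper's code): the brute-force algorithm tries every injection of V(H) into V(G), roughly |V(G)|^{|V(H)|} ≤ |V(G)|^{|V(G)|} candidates, and the result above says ETH forbids any |V(G)|^{o(|V(G)|)}-time improvement:

from itertools import permutations

def is_subgraph(H, G):
    """Brute force over all injections V(H) -> V(G); H is a subgraph of G
    if some injection sends every edge of H onto an edge of G.
    Assumes |V(H)| <= |V(G)|; graphs are (vertex list, edge set)."""
    VH, EH = H
    VG, EG = G
    for images in permutations(VG, len(VH)):
        phi = dict(zip(VH, images))
        if all((phi[u], phi[v]) in EG for (u, v) in EH):
            return True
    return False

def sym(edges):
    """Undirected graph: store each edge in both directions."""
    return set(edges) | {(v, u) for (u, v) in edges}

C4 = ([0, 1, 2, 3], sym({(0, 1), (1, 2), (2, 3), (3, 0)}))
P3 = ([0, 1, 2], sym({(0, 1), (1, 2)}))
K3 = ([0, 1, 2], sym({(0, 1), (1, 2), (0, 2)}))
print(is_subgraph(P3, C4))  # True  (a 3-vertex path sits inside C4)
print(is_subgraph(K3, C4))  # False (the 4-cycle is triangle-free)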
Nearly-tight VC-dimension bounds for piecewise linear neural networks
We prove new upper and lower bounds on the VC-dimension of deep neural networks with the ReLU activation function. These bounds are tight for almost the entire range of parameters. Letting W be the number of weights and L be the number of layers, we prove that the VC-dimension is O(WL log(W)), and provide examples with VC-dimension Ω(WL log(W/L)). This improves both the previously known upper ...
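To see in what sense these bounds are "nearly" tight (a back-of-the-envelope check, not an argument taken from the paper), compare the two:

\[
\frac{O\!\left(WL\log W\right)}{\Omega\!\left(WL\log(W/L)\right)}
  = O\!\left(\frac{\log W}{\log(W/L)}\right),
\]

so whenever $L \le W^{1-\varepsilon}$ for a fixed $\varepsilon > 0$ we have $\log(W/L) \ge \varepsilon \log W$ and the two bounds agree up to a constant factor; only for extremely deep, narrow networks ($L$ close to $W$) can a gap of up to $O(\log W)$ remain.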
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
Journal
Journal title: SIAM Journal on Computing
Year: 2013
ISSN: 0097-5397, 1095-7111
DOI: 10.1137/110832677